A modified Newton method for rootfinding with cubic convergence

Authors

Abstract
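The abstract itself is not reproduced on this page. For context, a classic example of a Newton modification with cubic convergence is the arithmetic-mean variant of Weerakoon and Fernando; this is an illustrative sketch of that well-known scheme, not necessarily the method of the paper above:

```python
def cubic_newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Arithmetic-mean Newton variant (Weerakoon & Fernando).

    Replaces Newton's f'(x_n) with the average of f' at x_n and at the
    Newton predictor point; for simple roots this raises the convergence
    order from 2 to 3 at the cost of one extra derivative evaluation.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        y = x - fx / fprime(x)                      # Newton predictor
        x = x - 2.0 * fx / (fprime(x) + fprime(y))  # averaged corrector
    return x
```

For example, `cubic_newton(lambda x: x*x - 2, lambda x: 2*x, 1.0)` converges to the root √2 in a handful of iterations.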


Similar articles

A Modified Newton-Type Method with Sixth-Order Convergence for Solving Nonlinear Equations

In this paper, we present and analyze a sixth-order convergent method for solving nonlinear equations. The method is free from second derivatives and permits f'(x) = 0 at some points. It requires three evaluations of the given function and two evaluations of its derivative in each step. Some numerical examples illustrate that the presented method is more efficient and performs better than classic...
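Under the standard Ostrowski–Traub efficiency index p^(1/d) (convergence order p per step, d function/derivative evaluations), order six with five evaluations edges out classical Newton. A quick check of that arithmetic (the index formula is standard, not taken from the abstract):

```python
# Efficiency index p ** (1 / d): convergence order p, d evaluations per step.
newton_index = 2 ** (1 / 2)  # Newton: order 2, one f and one f' per step
sixth_index = 6 ** (1 / 5)   # order 6, three f and two f' evaluations per step
# 6**(1/5) ≈ 1.431 exceeds 2**(1/2) ≈ 1.414, so the method gains more
# accuracy per evaluation than classical Newton does.
```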


A Modified Newton Method for Minimization I

Some promising ideas for minimizing a nonlinear function, whose first and second derivatives are given, by a modified Newton method, were introduced by Fiacco and McCormick (Ref. 1). Unfortunately, in developing a method around these ideas, Fiacco and McCormick used a potentially unstable, or even impossible, matrix factorization. Using some recently developed techniques for factorizing an inde...
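A common stable realization of this idea today is to add a multiple of the identity to the Hessian until a Cholesky factorization succeeds. The sketch below shows that standard device under assumed names; it is not Fiacco and McCormick's factorization nor necessarily the one developed in the paper:

```python
import numpy as np

def modified_newton_step(g, H, beta=1e-3):
    """Compute a descent step p solving (H + tau*I) p = -g, where tau >= 0
    is increased until H + tau*I admits a Cholesky factorization, i.e. is
    positive definite. This keeps the Newton step well defined even when
    the Hessian H is indefinite.
    """
    n = len(g)
    min_diag = np.min(np.diagonal(H))
    tau = 0.0 if min_diag > 0 else beta - min_diag
    while True:
        try:
            L = np.linalg.cholesky(H + tau * np.eye(n))
            break
        except np.linalg.LinAlgError:
            tau = max(2.0 * tau, beta)  # factorization failed: shift harder
    # Solve with the triangular factors: L L^T p = -g
    p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
    return p
```

Because the shifted matrix is positive definite, the returned `p` is always a descent direction for the gradient `g`.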


A Modified Orthant-Wise Limited Memory Quasi-Newton Method with Convergence Analysis

The Orthant-Wise Limited-memory Quasi-Newton (OWL-QN) method has been demonstrated to be very effective in solving the l1-regularized sparse learning problem. OWL-QN extends L-BFGS from solving unconstrained smooth optimization problems to l1-regularized (non-smooth) sparse learning problems. At each iteration, OWL-QN does not involve any l1-regularized quadratic optimization subproblem and on...
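The key device in OWL-QN is the pseudo-gradient of f(x) = loss(x) + λ‖x‖₁, which at zero coordinates picks the one-sided directional derivative that permits descent. A sketch of that standard definition, with illustrative variable names:

```python
import numpy as np

def pseudo_gradient(x, grad_loss, lam):
    """Orthant-wise pseudo-gradient of loss(x) + lam * ||x||_1.

    Away from zero the l1 term is differentiable, so sign(x_i) * lam is
    added directly. At x_i == 0, take the one-sided derivative if it is
    negative from the right or positive from the left (descent possible),
    otherwise zero (that coordinate is already at its minimum).
    """
    right = grad_loss + lam   # derivative approaching from x_i > 0
    left = grad_loss - lam    # derivative approaching from x_i < 0
    pg = np.where(x > 0, right, np.where(x < 0, left, 0.0))
    at_zero = (x == 0)
    pg = np.where(at_zero & (left > 0), left, pg)
    pg = np.where(at_zero & (right < 0), right, pg)
    return pg
```

Whenever the pseudo-gradient is nonzero, its negation is a descent direction that stays consistent with the current orthant.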


Mesh-independent convergence of the modified inexact Newton method for a second order non-linear problem

In this paper, we consider an inexact Newton method applied to a second-order nonlinear problem with higher-order nonlinearities. We provide conditions under which the method has a mesh-independent rate of convergence. To do this, we are required, first, to set up the problem on a scale of Hilbert spaces and, second, to devise a special iterative technique which converges in a higher than first o...


Stochastic Variance-Reduced Cubic Regularized Newton Method

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, which are specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an (ε, √ε)-approximate local minimum within Õ(n^(4/5)/ε^(3/2)) second-order oracl...
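At each iteration, cubic-regularization methods minimize a model m(s) = gᵀs + ½ sᵀHs + (M/6)‖s‖³ around the current point. Below is a toy subproblem solver using plain gradient descent on m; it is illustrative only, since practical implementations use more careful solvers, and in the paper's setting g and H would be the semi-stochastic estimates:

```python
import numpy as np

def cubic_model_step(g, H, M, lr=0.05, iters=500):
    """Approximately minimize the cubic-regularized model
        m(s) = g @ s + 0.5 * s @ H @ s + (M / 6) * ||s||**3
    by gradient descent, using grad m(s) = g + H s + (M/2) ||s|| s.
    """
    s = np.zeros_like(g)
    for _ in range(iters):
        grad_m = g + H @ s + 0.5 * M * np.linalg.norm(s) * s
        s = s - lr * grad_m
    return s
```

The cubic term keeps the model bounded below even when H is indefinite, which is why such methods can escape saddle points of non-convex objectives.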



Journal

Journal title: Journal of Computational and Applied Mathematics

Year: 2003

ISSN: 0377-0427

DOI: 10.1016/s0377-0427(03)00391-1